150 research outputs found

    Kernel Spectral Clustering and applications

    In this chapter we review the main literature related to kernel spectral clustering (KSC), an approach to clustering cast within a kernel-based optimization setting. KSC represents a least-squares support vector machine formulation of spectral clustering, described by a weighted kernel PCA objective. Just as in the classifier case, the binary clustering model is expressed by a hyperplane in a high-dimensional space induced by a kernel. In addition, multi-way clustering can be obtained by combining a set of binary decision functions via an Error Correcting Output Codes (ECOC) encoding scheme. Because of its model-based nature, the KSC method encompasses three main steps: training, validation and testing. In the validation stage, model selection is performed to obtain tuning parameters such as the number of clusters present in the data. This is a major advantage compared to classical spectral clustering, where the determination of the clustering parameters is unclear and relies on heuristics. Once a KSC model is trained on a small subset of the entire data, it is able to generalize well to unseen test points. Beyond the basic formulation, sparse KSC algorithms based on the Incomplete Cholesky Decomposition (ICD) and on L_0, L_1, L_0 + L_1 and Group Lasso regularization are reviewed. In that respect, we show how it is possible to handle large-scale data. Also, two possible ways to perform hierarchical clustering and a soft clustering method are presented. Finally, real-world applications such as image segmentation, power load time-series clustering, document clustering and big data learning are considered. Comment: chapter contribution to the book "Unsupervised Learning Algorithms".
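    As a rough, hedged illustration of the train-then-generalize idea described above (not the authors' KSC formulation), the sketch below fits a binary spectral-style clustering model on a small training subset and assigns unseen points by evaluating a kernel-expansion score, mimicking the "hyperplane in a kernel-induced space" view. The RBF kernel width sigma, the helper names rbf_kernel, train_binary_model and assign, and the eigenvector-based coefficients alpha and bias b are all illustrative assumptions.

        import numpy as np

        def rbf_kernel(A, B, sigma=1.0):
            # Gaussian (RBF) kernel matrix between the rows of A and B.
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / (2.0 * sigma ** 2))

        def train_binary_model(X_train, sigma=1.0):
            # Spectral-style training on a small subset: the leading non-trivial
            # eigenvector of the row-normalised kernel matrix plays the role of
            # the kernel-expansion coefficients alpha.
            K = rbf_kernel(X_train, X_train, sigma)
            M = K / K.sum(axis=1, keepdims=True)
            w, V = np.linalg.eig(M)
            order = np.argsort(-w.real)
            alpha = V[:, order[1]].real        # skip the constant eigenvector
            b = -float(np.mean(K @ alpha))     # centre the score around zero
            return alpha, b

        def assign(X, X_train, alpha, b, sigma=1.0):
            # Out-of-sample extension: sign of the kernel-expansion score.
            score = rbf_kernel(X, X_train, sigma) @ alpha + b
            return (score >= 0).astype(int)

        # Usage: two well-separated Gaussian blobs, trained on every 6th point.
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0.0, 0.3, (60, 2)),
                       rng.normal(3.0, 0.3, (60, 2))])
        X_train = X[::6]
        alpha, b = train_binary_model(X_train, sigma=1.0)
        labels = assign(X, X_train, alpha, b, sigma=1.0)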

    Online Learning Algorithm of Direct Support Vector Machine for Regression Based on Matrix Operation


    Neurocontrol: An overview


    Chaos Synchronization: a Lagrange Programming Network Approach

    In this paper we interpret chaos synchronization schemes within the framework of Lagrange programming networks, which form a class of continuous-time optimization methods for solving constrained nonlinear optimization problems. From this study it follows that standard synchronization schemes can be regarded as a Lagrange programming network with soft constraining, where synchronization between the state vectors is imposed as a constraint on the dynamical systems. New schemes are then proposed which implement synchronization by means of hard and soft constraints within Lagrange programming networks. A version is derived which takes synchronization errors into account within the problem formulation. Furthermore, Lagrange programming networks for achieving partial and generalized synchronization are given.
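    To make the soft-constraining interpretation concrete, here is a minimal sketch under stated assumptions (Lorenz rather than Chua dynamics, explicit Euler integration, coupling gain k chosen by hand; none of this is taken from the paper): penalising the constraint x = y with the quadratic term (k/2)||x - y||^2 and following its negative gradient adds the familiar coupling term k (x - y) to the slave dynamics, after which the synchronization error decays.

        import numpy as np

        def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            # Standard Lorenz vector field, used here only as an example system.
            x, y, z = s
            return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

        def simulate(k=8.0, dt=1e-3, steps=20000):
            x = np.array([1.0, 1.0, 1.0])      # master state
            y = np.array([-5.0, 5.0, 20.0])    # slave state, different initial condition
            errors = []
            for _ in range(steps):
                x = x + dt * lorenz(x)
                # slave dynamics plus the gradient of the soft-constraint penalty
                y = y + dt * (lorenz(y) + k * (x - y))
                errors.append(float(np.linalg.norm(x - y)))
            return errors

        err = simulate()
        print(err[0], err[-1])                 # the synchronization error shrinks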

    Recurrent least squares support vector machines


    Synchronization Theory for Lur'e Systems: An Overview

    We present a short overview of synchronization theory for Lur'e systems. Lur'e systems form a class of nonlinear systems that includes Chua's circuit, certain generalizations of it (e.g. n-scroll circuits) and coupled cells as examples. By exploiting the sector conditions on the nonlinear characteristics, one can derive sufficient conditions for global asymptotic stability of the error system. For secure communications using chaos we discuss the nonlinear H-infinity synchronization method. In the case of Lur'e systems, conditions for synchronization can be expressed in terms of matrix inequalities, and controllers for achieving synchronization are designed based upon these matrix inequalities. A similar methodology can be followed for impulsive synchronization schemes.
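    As an illustrative sketch of what such matrix inequalities can look like (a standard circle-criterion style condition obtained from a quadratic Lyapunov function and the S-procedure; an assumption for exposition, not the exact conditions of the paper), consider a master-slave pair of Lur'e systems with a scalar nonlinearity sigma that is slope-restricted to the sector [0, k], output matrix H and output-injection gain K:

        \dot{x} = A x + B\,\sigma(C x), \qquad
        \dot{\hat{x}} = A \hat{x} + B\,\sigma(C \hat{x}) + K H (x - \hat{x}),

        \dot{e} = (A - K H)\, e + B\, \eta, \qquad
        \eta = \sigma(C x) - \sigma(C \hat{x}), \qquad \eta\,(k C e - \eta) \ge 0 .

    The error e = x - \hat{x} is then globally asymptotically stable if there exist P = P^{\top} \succ 0 and \lambda > 0 such that

        \begin{bmatrix}
            (A - K H)^{\top} P + P (A - K H) & P B + \lambda k\, C^{\top} \\
            B^{\top} P + \lambda k\, C & -2\lambda
        \end{bmatrix} \prec 0 .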